T0pp is an 11-billion-parameter encoder-decoder model based on the T5 architecture. It excels at zero-shot generalization to unseen tasks specified through English natural-language prompts, outperforming GPT-3 on many held-out benchmarks while being roughly 16x smaller (11B vs. 175B parameters).
Tags: Large Language Model · Transformers · English